Blog posts tagged with 'how to optimize for baidu'

Duplicate Content Filter: What it is and how it works - Tuesday, September 06, 2011

Duplicate Content has become a huge topic of discussion lately, thanks to the new filters that search engines have implemented. This article will help you understand why you might be caught in the filter, and ways to avoid it. We'll also show you how you can determine if your pages have duplicate content, and what to do to fix it.

Search engine spam is any deliberate attempt to trick the search engine into returning inappropriate, redundant, or poor-quality search results. This behavior is often seen in pages that are exact replicas of other pages, created to rank better in the search engines. Many people assume that creating multiple or similar copies of the same page will either increase their chances of getting listed in search engines or help them get multiple listings, due to the presence of more keywords.

To make searches more relevant to users, search engines use a filter that removes duplicate content pages from the search results, and the spam along with them. Unfortunately, good, hardworking webmasters have fallen prey to these filters as well, unknowingly "spamming" the search engines when there are simple things they could do to avoid being filtered out. To truly understand what you can do to avoid the duplicate content filter, you first need to know how this filter works.

First, we must understand that the term "duplicate content penalty" is actually a misnomer. When we refer to penalties in search engine rankings, we are talking about points that are deducted from a page to arrive at an overall relevancy score. In reality, duplicate content pages are not penalized; they are simply filtered, the way you would use a sieve to remove unwanted particles. Sometimes, "good particles" are accidentally filtered out as well.

Knowing the difference between the filter and the penalty, you can now understand how a search engine determines what duplicate content is. There are basically four types of duplicate content that are filtered out:

  1. Websites with Identical Pages - Identical pages are considered duplicate content, and websites that are identical to another website on the Internet are considered spam. Affiliate sites with the same look and feel that contain identical content, for example, are especially vulnerable to the duplicate content filter. Another example is a website built on doorway pages. These doorways are often skewed versions of landing pages that are themselves identical to other landing pages. Doorway pages are generally intended to spam the search engines and manipulate search results.
  2. Scraped Content - Scraped content is taking content from a web site and repackaging it to make it look different, but in essence it is nothing more than a duplicate page. With the popularity of blogs on the internet and the syndication of those blogs, scraping is becoming more of a problem for search engines.
  3. E-Commerce Product Descriptions - Many eCommerce sites out there use the manufacturer's descriptions for the products, which hundreds or thousands of other eCommerce stores in the same competitive markets are using too. This duplicate content, while harder to spot, is still considered spam.
  4. Distribution of Articles - If you publish an article and it gets copied all over the Internet, this is good, right? Not necessarily for all the sites that feature the same article. This type of duplicate content can be tricky: although Yahoo and MSN determine the source of the original article and deem it most relevant in search results, other search engines, like Google, may not, according to some experts.

So, how does a search engine's duplicate content filter work? Essentially, when a search engine robot crawls a website, it reads the pages, and stores the information in its database. Then, it compares its findings to other information it has in its database. Depending upon a few factors, such as the overall relevancy score of a website, it then determines which are duplicate content, and then filters out the pages or the websites that qualify as spam. Unfortunately, if your pages are not spam, but have enough similar content, they may still be regarded as spam.

There are several things you can do to avoid the duplicate content filter. First, you must be able to check your pages for duplicate content. Using our Similar Page Checker, you will be able to determine similarity between two pages and make them as unique as possible. By entering the URLs of two pages, this tool will compare those pages, and point out how they are similar so that you can make them unique.

Since you need to know which sites might have copied your site or pages, you will need some help. We recommend using a tool that searches for copies of your page on the Internet: www.copyscape.com. Here, you can put in your web page URL to find replicas of your page on the Internet. This can help you create unique content, or even address the issue of someone "borrowing" your content without your permission.

Let's look at the issue of some search engines possibly not considering the source of the original content from distributed articles. Remember, some search engines, like Google, use link popularity to determine the most relevant results. Continue to build your link popularity, while using tools like www.copyscape.com to find out how many other sites have the same article; if allowed by the author, you may be able to alter the article so as to make the content unique.

If you use distributed articles for your content, consider how relevant the article is to your overall web page and to the site as a whole. Sometimes, simply adding your own commentary to an article can be enough to avoid the duplicate content filter; the Similar Page Checker can help you make your content unique. Furthermore, the more relevant articles you can add to complement the first article, the better. Search engines look at the entire web page and its relationship to the whole site, so as long as you aren't exactly copying someone's pages, you should be fine.

If you have an eCommerce site, you should write original descriptions for your products. This can be hard to do if you have many products, but it really is necessary if you wish to avoid the duplicate content filter. This is another example of why the Similar Page Checker is a great idea: it can show you how to change your descriptions so as to have unique and original content for your site. This works well for scraped content too. Many scraped content sites offer news; with the Similar Page Checker, you can easily determine where the news content is similar, and then change it to make it unique.

Do not rely on an affiliate site that is identical to other sites, and do not create identical doorway pages. These kinds of pages are not only filtered out immediately as spam; if another site or page is found to be a duplicate, there is generally no comparison of the page to the site as a whole, which can get your entire site in trouble.

The duplicate content filter is sometimes hard on sites that don't intend to spam the search engines. But it is ultimately up to you to help the search engines determine that your site is as unique as possible. By using the tools in this article to eliminate as much duplicate content as you can, you'll help keep your site original and fresh.

What is Robots.txt - Tuesday, September 06, 2011

Robots.txt

It is great when search engines frequently visit your site and index your content, but there are often cases when indexing parts of your online content is not what you want. For instance, if you have two versions of a page (one for viewing in the browser and one for printing), you'd rather have the printing version excluded from crawling; otherwise you risk incurring a duplicate content penalty. Also, if you have sensitive data on your site that you do not want the world to see, you will prefer that search engines do not index those pages (although in this case the only sure way to keep sensitive data from being indexed is to keep it offline on a separate machine). Additionally, if you want to save some bandwidth by excluding images, stylesheets and JavaScript from indexing, you need a way to tell spiders to keep away from these items.

One way to tell search engines which files and folders on your website to avoid is the Robots metatag. But since not all search engines read metatags, the Robots metatag can simply go unnoticed. A better way to communicate your wishes to search engines is a robots.txt file.

What Is Robots.txt?

Robots.txt is a text (not html) file you put on your site to tell search robots which pages you would like them not to visit. Robots.txt is by no means mandatory for search engines, but generally search engines obey what they are asked not to do. It is important to clarify that robots.txt is not a way of preventing search engines from crawling your site (i.e. it is not a firewall or a kind of password protection); putting a robots.txt file in place is something like putting a note "Please do not enter" on an unlocked door – it cannot prevent thieves from coming in, but the good guys will not open the door and enter. That is why we say that if you have really sensitive data, it is naïve to rely on robots.txt to protect it from being indexed and displayed in search results.

The location of robots.txt is very important. It must be in the main directory because otherwise user agents (search engines) will not be able to find it – they do not search the whole site for a file named robots.txt. Instead, they look first in the main directory (i.e. http://mydomain.com/robots.txt) and if they don't find it there, they simply assume that this site does not have a robots.txt file and therefore they index everything they find along the way. So, if you don't put robots.txt in the right place, do not be surprised that search engines index your whole site.

The concept and structure of robots.txt was developed more than a decade ago; if you are interested in learning more about it, visit http://www.robotstxt.org/ or go straight to the Standard for Robot Exclusion, because in this article we will deal only with the most important aspects of a robots.txt file. Next, we will continue with the structure of a robots.txt file.

Structure of a Robots.txt File

The structure of a robots.txt file is pretty simple (and not very flexible) – it is a list of user agents and disallowed files and directories. Basically, the syntax is as follows:

User-agent:

Disallow:

"User-agent:" names a search engine crawler, and "Disallow:" lists the files and directories to be excluded from indexing. In addition to "User-agent:" and "Disallow:" entries, you can include comment lines – just put the # sign at the beginning of the line:

# All user agents are disallowed to see the /temp directory.

User-agent: *

Disallow: /temp/

The Traps of a Robots.txt File

When you start making complicated files – i.e. you decide to allow different user agents access to different directories – problems can start if you do not pay special attention to the traps of a robots.txt file. Common mistakes include typos and contradictory directives: misspelled user agents or directories, missing colons after User-agent and Disallow, and so on. Typos can be tricky to find, but in many cases validation tools help.

The more serious problem is with logical errors. For instance:

User-agent: *

Disallow: /temp/

User-agent: Googlebot

Disallow: /images/

Disallow: /temp/

Disallow: /cgi-bin/

The above example comes from a robots.txt file that is meant to allow all agents to access everything on the site except the /temp directory, while additionally keeping Googlebot out of /images/ and /cgi-bin/. Under the Standard for Robot Exclusion, a robot should obey only the record that matches its own name, with the "*" record acting as a fallback for everyone else. But some crawlers simply read the file from top to bottom and obey the first record that matches them: such a crawler would see that all user agents (including itself) are allowed everywhere except /temp/, stop reading the file, and index everything except /temp/ – including /images/ and /cgi-bin/, which you think you have told it not to touch. Also note that records do not combine: a robot that uses the Googlebot record ignores the "*" record entirely, which is why /temp/ has to be repeated. The structure of a robots.txt file is simple, yet serious mistakes can be made easily.
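Since you cannot be sure that every crawler implements the standard correctly, a defensive rewrite of the file above is to place the specific records before the catch-all record:

```text
# Specific records first: crawlers that read top-to-bottom match them
# before the catch-all record. Records do not inherit from each other,
# so /temp/ must be repeated for Googlebot.
User-agent: Googlebot
Disallow: /images/
Disallow: /temp/
Disallow: /cgi-bin/

# Everyone else
User-agent: *
Disallow: /temp/
```

This ordering gives the same result under both interpretations: compliant robots pick the record matching their name, and first-match crawlers hit the specific record before the catch-all.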

Tools to Generate and Validate a Robots.txt File

Given the simple syntax of a robots.txt file, you can always read it yourself to see if everything is OK, but it is much easier to use a validator, like this one: http://tool.motoricerca.info/robots-checker.phtml. These tools report common mistakes like missing slashes or colons, which, if not detected, compromise your efforts. For instance, if you have typed:

User agent: *

Disallow: /temp/

this is wrong because the hyphen between “User” and “agent” is missing and the syntax is incorrect.

When you have a complex robots.txt file – i.e. you give different instructions to different user agents, or you have a long list of directories and subdirectories to exclude – writing the file manually can be a real pain. But do not worry: there are tools that will generate the file for you. What is more, there are visual tools that allow you to point and select which files and folders are to be excluded. And even if you do not feel like buying a graphical tool for robots.txt generation, there are online tools to assist you. For instance, the Server-Side Robots Generator offers a dropdown list of user agents and a text box for you to list the files you don't want indexed. Honestly, it is not much help unless you want to set specific rules for different search engines, because in any case it is up to you to type the list of directories, but it is better than nothing.
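Besides online validators, you can also sanity-check your rules locally. As a sketch, Python's standard library ships a robots.txt parser (urllib.robotparser) that lets you ask whether a given user agent may fetch a given URL; the domain and paths below are just illustrations:

```python
# Checking robots.txt rules locally with Python's standard library.
# The rules below mirror the simple example discussed in this article.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /temp/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())  # parse() accepts the file as a list of lines

# The "*" record applies to every crawler, so /temp/ is off-limits to all:
print(parser.can_fetch("Googlebot", "http://mydomain.com/temp/page.html"))  # False
print(parser.can_fetch("Googlebot", "http://mydomain.com/index.html"))      # True
```

This is handy for testing a draft file before uploading it to your main directory.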

Optimizing for MSN - Tuesday, September 06, 2011

SEO experts often forget that there are three major search engines. While there is no doubt that Google is number one with the most searches, and Yahoo! manages to get about a quarter of the market, MSN has not retired yet. It holds about 10-15 percent of searches (according to some sources even less – about 5%), but it has a loyal audience that can't be reached through the other two major search engines, so if you plan a professional SEO campaign, you can't afford to skip MSN. In a sense, getting high rankings in MSN is similar to getting high rankings for less popular keywords: because the competition is not as tough, you may well be able to get enough visitors from MSN alone, which is rarely the case with the more popular search engines.

Although optimizing for MSN is different from optimizing for Google and Yahoo!, there are still common rules that will help you rank high in any search engine. As a rule, if you rank well in Google, chances are that you will rank well in Yahoo! (if you are interested in tips and tricks for optimizing for Yahoo!, have a look at the Optimizing for Yahoo! article) and in MSN as well. The opposite is not true, however: if you rank well in MSN, there is no guarantee that you'll do the same in Google. So when you optimize for MSN, keep an eye on your Google ranking as well. It's no good to top MSN and be nowhere in Google (the opposite is more acceptable, if you have to choose).

But why is this so? The answer is simple - the MSN algorithm is different and that is why, even if the same pages were indexed, the search results will vary.

The MSN Algorithm

As already mentioned, it is the different MSN algorithm that leads to such drastic results in ranking. Otherwise, MSN, like all search engines, first spiders the pages on the Web, then indexes them in its database and after that applies the algorithm to generate the pages with the search results. So, the first step in optimizing for MSN is the same as for the other search engines – to have a spiderable site. (Have a look at Search Engine Spider Simulator to see how spiders see your site). If your site is not spiderable, then you don't have even a hypothetical chance to top the search results.

There is quite a lot of speculation about the MSN algorithm. Looking at the search results MSN delivers, it is obvious that its algorithm is not as sophisticated as Google's, or even Yahoo!'s, and many SEO experts agree that the MSN search algorithm is years behind its competitors. So, what can you do in this case – optimize as you did for Google a couple of years ago? You would not be far from the truth, though it is actually not that simple.

One of the most important differences is that MSN still relies heavily on metatags, as explained below. None of the other major search engines uses metatags that heavily anymore. It is obvious that metatags give SEO experts a great opportunity for manipulating search results. Maybe metatags are the main reason for the inaccurate search results that MSN often produces.

The second most important difference between MSN and the other major search engines is the approach to keywords. Keywords are very, very important for MSN too, but unlike Google, MSN is dominated by onpage factors, while offpage factors (like backlinks, for example) are still of minor importance. It is a safe bet that the importance of backlinks will grow in the future, but for now they are not a primary factor for high rankings.

Keywords, Keywords, Keywords

It is hardly surprising that keywords are the most important item for MSN. What is surprising is how much MSN relies on them. It is very easy to fool MSN – just artificially inflate your keyword density, put a couple of keywords in file names (and even better, in domain names) and near the top of the page, and you are almost done. But if you apply these black hat practices, your joy of topping MSN will not last long, because unless you provide separate pages optimized for Google, your stuffed pages might well get you banned from Google. And if you decide to have separate pages for Google and MSN – first, it is hardly worth the trouble, and second, the risk of a duplicate content penalty can't be ignored.

So, what is the catch? The catch is that if you try to polish your site for MSN and stuff it with keywords, this might get you into trouble with Google, which certainly is worse than not ranking well in MSN. But if you optimize wisely, it is more likely than not that you will rank decently in Google and perform well in Yahoo! and MSN as well.

Metatags

Having meaningful metatags never hurts but with MSN this is even more important because its algorithm still uses them as a primary factor in calculating search results. Having well-written (not stuffed) metatags will help you with MSN and some other minor search engines, while at the same time well-written metatags will not get you banned from Google.

The Description metatag is very important:

<META NAME="Description" CONTENT="Place your description here" />

MSNBot reads its content and, based on that (in addition to the keywords found on the page), judges how to classify your site. So if you leave this tag empty (i.e. CONTENT=""), you have missed a vital chance to be noticed by MSN. There is no evidence that MSN uses the other metatags in its algorithm, which is why leaving the Description metatag empty is even more unforgivable.
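For illustration, a head section with a meaningful Description metatag might look like this (the store name and description text are hypothetical, of course):

```html
<head>
<title>Handmade Leather Wallets - Example Store</title>
<!-- MSNBot reads this description (together with on-page keywords)
     when classifying the page; write a short, accurate summary of
     the page, not a keyword list. -->
<META NAME="Description" CONTENT="Handmade leather wallets and belts, crafted to order and shipped worldwide." />
</head>
```

A well-written description like this helps with MSN without risking trouble with Google.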

Choosing SEO as Your Career - Tuesday, September 06, 2011

It's always better to know in advance what you can expect from a career in SEO.

Some Good Reasons to Choose SEO as Your Career

1. High demand for SEO services

Once SEO was not a separate profession – Web masters performed some basic SEO for the sites they managed and that was all. But as sites began to grow and make money, it became more reasonable to hire a dedicated SEO specialist than to have the Web master do it. The demand for good SEO experts is high and is constantly on the rise.

2. A LOT of people have made a successful SEO career

There are many living proofs that SEO is a viable business. The list is too long to quote here, but some of the names include Rob from Blackwood Productions, Jill Whalen from High Rankings, Rand Fishkin from SEOmoz and many others.

3. Search Engine Optimizers make good money!

SEO is a profession that can be practiced while working for a company or as a solo practitioner. There are many job boards, like Dice and Craigslist, that publish SEO job advertisements. It is worth noting that the compensation for SEO employees is equal to or even higher than that of developers, designers and marketers; salaries over $80K per annum are not an exception for SEO jobs.
As a solo SEO practitioner you can make even more money. Almost all freelance sites have sections for SEO services, and offers for $50 an hour or more are quite common. If you are still not confident that you can work on your own, you can take an SEO job, learn a bit, and then start your own company.
If you already feel confident that you know a lot about SEO, you can take this quiz and see how you score. Well, don't get depressed if you didn't pass – here is a great checklist that will teach you a lot, even if you are already familiar with SEO.

4. Web design alone MAY NOT be enough

Many companies offer turn-key solutions that include Web design, Web development AND SEO optimization. In fact, many clients expect that when they hire somebody to build their site, the site will be SEO friendly, so if you are good both as a designer and as an SEO expert, you will be a truly valuable professional.
On the other hand, many other companies deal with SEO only, because they feel that this way they can concentrate their efforts on their major strength – SEO – so you can consider this possibility as well.

5. Logical step ahead if you come from marketing or advertising

The Web has changed the way companies do business, so to some extent today's marketers and advertisers need to have at least some SEO knowledge if they want to be successful. SEO is also a great career for linguists.

6. Lots of learning

For somebody who comes from design, development or web administration, SEO might not look technical enough, and you might feel that moving to SEO is a downgrade. Don't worry so much – you can learn a LOT from SEO, so if you are a talented techie, you are not downgrading; you are actually upgrading your skill set.

7. SEO is already recognized as a career

Finally, if you need more proof that SEO is a great career, have a look at the available SEO courses and exams for SEO practitioners. They might not be a Cisco certification, but they still help to institutionalize the SEO profession.

Some Ugly Aspects of SEO

1. Dependent on search engines

It is true that in any career there are many things outside of your control, but for SEO this is rule number one. Search engines frequently change their algorithms and, what is worse, these changes are not made public, so even the greatest SEO gurus admit that they make a lot of educated guesses about how things work. It is very discouraging to make everything perfect and then learn that, due to a change in the algorithm, your sites have dropped 100 positions. But the worst part is having to communicate this to clients, who are not satisfied with their sinking ratings.

2. No fixed rules

Probably this will change over time, but for now the rule is that there are no rules – or at least no written ones. You can work very hard, follow everything that looks like a rule, and still success does not come. Currently you can't even take a search engine to court for the damage done to your business, because search engines are not obliged to rank highly the sites that have made efforts to get optimized.

3. Rapid changes in rankings

But even if you somehow manage to get to the top for a particular keyword, keeping the position requires constant effort. Many other businesses are like that too, so this is hardly a reason to complain – except when an angry customer starts shouting at you that their ratings are sinking this week and, of course, it is all your fault.

4. SEO requires patience

The SEO professional and customers both need to understand that SEO takes constant effort and time. It could take months to move ahead in the ratings, or to build tens of links. Additionally, if you stop optimizing for some time, most likely you will experience a considerable drop in ratings. You need lots of motivation and patience not to give up when things are not going your way.

5. Black hat SEO

Black hat SEO is probably one of the biggest concerns for the would-be SEO practitioner. Fraud and unfair competition are present in any industry, and those who are good and ethical suffer from them, but in SEO black hat practices are still pretty widespread. It is true that search engines penalize black hat practices, yet they remain a major concern for the industry.

So, let's hope that by telling you about the pros and cons of choosing SEO as your career we have helped you make an informed decision about your future.

HTML 5 and SEO - Tuesday, September 06, 2011

HTML 5 is still in the making, but for any SEO expert who tries to look ahead, some knowledge of HTML 5 and how it will impact SEO is valuable. It is true that the changes and new concepts in HTML 5 will affect Web developers and designers much more than SEO experts, but it would be far from the truth to say that HTML 5 will not mean changes in organic SEO practice.

What's New in HTML 5?

HTML 5 follows the way the Web has evolved in recent years and includes many useful new tags and elements. At first glance, it might look as if HTML 5 is moving in the direction of a programming language (e.g. PHP), but actually this is not so – it is still a markup language for presentation. The new tags and elements might make HTML 5 look more complex, but this is only at first glance.

HTML 5 is not very different from HTML 4. One of the basic ideas in the development of HTML 5 was to ensure backward compatibility and because of that HTML 5 is not a complete revamp of the HTML specification. So, if you had worries that you will have to start learning it from scratch, these worries are groundless.

How Will the Changes in HTML 5 Affect SEO?

As an SEO expert, you are most likely interested mainly in those changes in the HTML 5 specification that will affect your work. Here are some of them:

  • Improved page segmentation. Search engines are getting smarter and there are many reasons to believe that even now they are applying page segmentation. Basically, page segmentation means that a page is divided into several separate parts (i.e. main content, menus, headers, footers, links sections, etc.) and these parts are treated as separate entries. At present, there is no way for a Web master to tell search engines how to segment a page but this is bound to change in HTML 5.

  • A new <article> tag. The new <article> tag is probably the best addition from an SEO point of view. It allows you to mark separate entries in an online publication, such as a blog or a magazine. It is expected that when entries are marked with the <article> tag, the HTML code will be cleaner, because the need for <div> tags will be reduced. Also, search engines will probably put more weight on the text inside the <article> tag than on the contents of the other parts of the page.

  • A new <section> tag. The new <section> tag can be used to identify separate sections of a page, chapter, or book. The advantage is that each section can have its own HTML heading. As with the <article> tag, it can be presumed that search engines will pay more attention to the contents of separate sections. For instance, if the words of a search string are all found in one section, this implies higher relevance than when these words are scattered across the page or across separate sections.

  • A new <header> tag. The new <header> tag (which is different from the <head> element) is a blessing for SEO experts because it gives a lot of flexibility. The <header> tag is very similar to the <H1> tag, but with the difference that it can contain a lot of other content: H1, H2, H3 elements, whole paragraphs of text, hard-coded links (which is really precious for SEO), and any other information you feel is relevant to include.

  • A new <footer> tag. The <footer> tag might not be as useful as the <header> one, but it still allows you to include important information there, and it can be used for SEO purposes as well. The <header> and <footer> tags can be used many times on one page – i.e. you can have a separate header/footer for each section – and this gives really a lot of flexibility.

  • A new <nav> tag. Navigation is one of the important factors for SEO and everything that eases navigation is welcome. The new <nav> tag can be used to identify a collection of links to other pages.

As you see, the new tags follow the common structure of a standard page, and each of the parts (i.e. header, footer, main section) has a separate tag. The tags described here are just some (but certainly not all) of the new tags in HTML 5 that will affect SEO in some way. For instance, the <audio>, <video> and <dialog> tags are also part of the HTML 5 standard, and they will allow content to be further separated into appropriate categories. There are many other tags, but they are of relatively lower importance and that is why they are not discussed here.
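To see how these tags map onto a typical page, here is a hypothetical HTML 5 skeleton (the site name, headings and links are illustrative only):

```html
<body>
  <!-- Site-wide header: can hold headings, text and hard-coded links -->
  <header>
    <h1>SEO Weekly</h1>
    <nav>
      <a href="/">Home</a>
      <a href="/archive">Archive</a>
    </nav>
  </header>

  <!-- Main content area, segmented into sections and articles -->
  <section>
    <article>
      <header><h2>Optimizing for MSN</h2></header>
      <p>The article text goes here...</p>
      <!-- Each article/section may have its own footer -->
      <footer>Posted on Tuesday, September 06, 2011</footer>
    </article>
  </section>

  <footer>Copyright notice and site-wide links</footer>
</body>
```

Structured like this, a page tells search engines explicitly which part is navigation, which is the main entry, and which is boilerplate.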

For now, HTML 5 is still in the future. When more pages become HTML 5-compliant, search engines will pay more attention to HTML 5, and only then will it be possible to know exactly how search engines will treat HTML 5 pages. Mass adoption of HTML 5 won't happen soon, and it is a safe bet that for now you can keep to HTML 4 with no concerns. Additionally, it will take some time for browsers to adjust to HTML 5, which further delays the moment when HTML 5 will be everywhere.

However, once HTML 5 is accepted and put to use, it will be the dominant standard for years to come, and that is why you might want to keep an eye on what other webmasters are doing, just to make sure you do not miss the moment when HTML 5 becomes the de facto standard.

YouTube Traffic - Tuesday, September 06, 2011

YouTube is one of the most popular sites on the Web, and in addition to all the fun there, it offers many opportunities for promotion and for getting traffic to your site. As with Facebook and Twitter, in order to use YouTube successfully for promotion and traffic, you need to know the rules. Here are some tips on how to promote yourself, your site, and your products, and how to get free traffic from YouTube:

    1. Post viral videos

    There are millions of videos on YouTube. If you post a video nobody is interested in, it will go unnoticed, like millions of others. The key to getting traffic from YouTube is to post useful videos – or even better, viral videos. Viral videos are not only useful; they also appeal to large groups of people. If your video goes viral, people will promote it for you, and the only thing left for you to do is reap the benefits.

    2. Create an interesting profile

    Similarly to Facebook, Twitter, or any other social networking site, an interesting profile is a must. If people like your videos, they will check your profile to learn more about you. If they see that your profile is boring, they won't bother with you anymore. You can make your profile a bit informal, but don't make it look like the profile of a crazy teenager – you are using YouTube for business, right?

    3 Include your logo and website in the video

    Your logo and your website URL are your major branding weapons, which is why you must include them in the video. You can place them at the beginning of the video or at the end. It is best to keep your logo and URL visible throughout the whole video, because this gains you the most exposure, but if you can't (for instance, because of artistic considerations), the beginning and the end of the video will suffice.

    4 Post quality videos

    As already mentioned, there is no shortage of videos on YouTube. Unfortunately, this also means there is no shortage of poor-quality videos. Viewers don't favor these, so if you want people to watch your videos, make sure yours don't have crappy sound and/or a blurry picture. YouTube is not a board for professional videographers, so you can post amateur videos, but make sure their quality is decent.

    5 Promote your videos

    If your videos go viral, you are lucky, but you can't count on this. In order to get YouTube traffic, your videos need viewers. You can't rely solely on viewers finding your videos on their own – you need to promote them. Even viral videos benefit from promotion.

    6 Make your videos search-friendly

    One of the ways viewers find your videos is through search – both locally on YouTube and on search engines. This is why you need to make your videos search-friendly. To do this, include your major keywords in the title and the description. Also, pay special attention to the tags: list as many relevant keywords as apply, but be careful not to get spammy.

    7 Post in series

    A standalone video can become a hit, but it is best to create a series of videos and post them once a day or week. This way viewers will know there is more to come and will keep checking back. Even if you don't create a series, at least try to post videos regularly – this builds audience loyalty.

    8 Post video responses

    Video responses are one of the unique features of YouTube, and you should take full advantage of them. Search your niche, choose the most popular videos in it, and post video responses to them. Just make sure the response you post is related to the video you are responding to, and don't make your video response blatant self-promotion.

    9 Choose the right time to post your videos

    On YouTube, timing is very important, because traffic has peaks as well as times when there are not many viewers. Weekday mornings or early afternoons US time (especially Wednesdays and, above all, Thursdays) are the best time to post a general-interest video. In order to have your video uploaded in prime time, you need to plan a bit. Keep in mind that for large videos and/or slow Internet connections the upload could take an hour, so start early.

    10 Keep your videos short

    YouTube doesn't impose limits on the length of the videos it publishes, but generally long videos are boring. Three to five minutes is the best duration for a video, though if required you could go from one to six minutes. When a video runs longer than six or seven minutes, it gets boring, and not many people will watch it to the end (where your logo and URL are to be found). Three to five minutes is enough to lay out your idea, give some details, AND tell viewers to visit your site for more.

    11 Comment on other people's videos and include a link to your site in your comment

    In addition to video responses, you can also use plain good comments. Again, search for popular videos in your niche and comment on them. If viewers like your comments, they will check your profile and probably watch your videos.

    YouTube is a valuable resource for driving traffic to your site and promoting it. The competition there might be fierce, but there is always room for a couple of good videos. Fill that room before your competitors do!

How to Pick an SEO Friendly Designer - Monday, September 05, 2011
It is very important to hire an SEO-friendly designer, because if you don't, and your site is designed in an SEO-unfriendly fashion, you won't be able to compensate for this later. This article will tell you how to pick an SEO-friendly designer and save yourself the disappointment of low search engine rankings.

A Web designer is one of the people without whom it is impossible to create a site. However, where SEO is concerned, Web designers can be really painful to deal with. While many Web designers are SEO-proficient, it is still common to stumble upon design geniuses who focus only on the graphic aspect of the site. For them, SEO is none of their business, and they couldn't care less about something as unimportant as good search engine rankings. Needless to say, if you hire such a designer, don't expect your site to rank well with search engines.

If you plan to do SEO on your own, you might not care much about the SEO skills of your Web designer, but as we'll see next, there are still design issues that can affect your rankings very badly. When a designer builds the site against SEO rules, it is not possible to fix this later with SEO tricks.

When we say that you need to hire an SEO-friendly designer, we presume that you are an SEO pro and know SEO; if you aren't, have a look at the SEO Tutorial and the SEO Checklist. If you have no idea about SEO, you will hardly be able to select an SEO-friendly designer, because you won't know what to look for.

One of the ultimate tests of whether a designer is SEO-friendly is to look at his or her past sites – are they done professionally, especially in the SEO department? If their past sites don't exhibit blatant SEO mistakes, such as the ones we'll list in a second, and they rank well, that is a good sign that this person is worth hiring. Even so, after you look at past sites, ask the designer whether he or she did the SEO for them, because in some cases the client may have done a lot to optimize the site, and that is why it ranks well.

Here is a checklist of common web design sins that will make your site an SEO disaster. If you notice any of the following in the past sites your would-be designer has created, just move on to the next designer. These SEO-unfriendly design elements are absolute sins, and unless the client made them do it, no designer who uses the techniques below deserves your attention:

1 Rely heavily on Flash

Many designers still believe that Flash is the best thing since sliced bread. While Flash can be very artistic and make a site look cool (and load forever in the browser), heavily Flash-ed sites are a disaster in terms of SEO. Simple HTML sites rank better with search engines, and as we point out in Optimizing Flash Sites, if the use of Flash is a must, then an HTML version of the same page is more than mandatory.

2 No internal links, or very few links

Internal links are backlinks too, and they are very important. Of course, this doesn't mean that all the text on a page must be hyperlinked to all the other pages on the site, but if there are only a couple of internal links per page, that is a missed chance to get backlinks.

3 Images, not text for anchors

This is another frequent mistake many designers make. Anchor text is vital in SEO, and when your links lack anchor text, that is bad. It is true that for menu items and other page elements it is much easier to use an image than text, because with text you can never be sure it will display correctly on users' screens, but since this hurts your site's rankings, you should sacrifice beauty for functionality.
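To illustrate the difference, here is a minimal HTML sketch; the URL and labels are hypothetical examples, not taken from any real site:

```html
<!-- SEO-unfriendly: the link has no anchor text for spiders to read -->
<a href="/services.html"><img src="btn-services.png"></a>

<!-- Better: a plain text link gives search engines real anchor text -->
<a href="/services.html">Web Design Services</a>

<!-- Compromise: keep the image but add descriptive alt text,
     which most engines treat as a weaker form of anchor text -->
<a href="/services.html"><img src="btn-services.png" alt="Web Design Services"></a>
```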

4 Messy code and tons of code

If you have no idea about HTML, it might be impossible for you to judge whether a site's code is messy or the amount of code excessive, but cleanliness of code is an important criterion for SEO. When the code is messy, it might not be spiderable at all, and this can exclude your site from search engines altogether, because they won't be able to index it.

5 Excessive use of (SEO non-friendly) JavaScript

As with Flash, search engines don't love JavaScript, especially tons of it. Actually, the worst thing about JavaScript is that, if it is not coded properly, it is quite possible that your pages (or parts of them) will not be spiderable, which automatically means they won't be indexed.

6 Overoptimized sites

Overoptimized sites are no better than under-optimized ones. In fact, they could be much worse, because keyword stuffing and other techniques used to artificially inflate a site's rankings (even when they are not Black Hat SEO) could get you banned from search engines – and that is the worst thing that can happen to a site.

7 Dynamic and other SEO non-friendly URLs

Well, maybe dynamic URLs aren't exactly a design issue, but if you are getting a turn-key site – i.e. it is not up to you to upload and configure it or create the internal links – then dynamic URLs are bad, and you have to ask the designer/developer not to use them. You can rewrite dynamic and other SEO-unfriendly URLs on your own, but in practice this means making dramatic changes to the site, which is hardly the point of hiring a designer.
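For reference, URL rewriting is usually handled on the server. Here is a minimal sketch for Apache's mod_rewrite, assuming a hypothetical `product.php?id=123` dynamic URL scheme; the script name and the `/products/` path are illustrative only:

```apache
# .htaccess in the site root (requires mod_rewrite to be enabled)
RewriteEngine On

# Map the friendly URL /products/123 to the real dynamic script,
# passing 123 as the id parameter; QSA keeps any extra query string
RewriteRule ^products/([0-9]+)/?$ product.php?id=$1 [L,QSA]
```

With a rule like this, pages can link to the clean `/products/123` form while the dynamic script keeps working unchanged behind the scenes.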

These points are very important, and you need to keep them in mind when choosing an SEO-friendly designer. Some of the items on the list (i.e. Flash, JavaScript) are so bad for SEO that even if the site is a design masterpiece and you promote it heavily, you will still be unable to get decent rankings. SEO-friendliness of design is a necessity, not a whim, and you shouldn't settle for an SEO-unfriendly design – this can be really expensive!

How to Optimize your Website for Mobile Search - Monday, September 05, 2011
Mobile search is different from desktop search, and if you have lots of mobile visitors, you need to make your site mobile-friendly. Shorter keywords, shorter pages, current info, and compliance with mobile standards are some of the key points to follow in order to make your site suitable for mobile searchers.

It is not only web designers and developers who need to adapt to the mobile Web – SEO experts also have to make changes to their strategies and tactics if they want to capture the lucrative mobile search market. Mobile search is a constantly growing segment of the market, which is good news. However, mobile search has its own rules, and they are somewhat different from the rules of traditional desktop search. This is why, if you don't want to miss mobile searchers, you need to adapt to their requirements. Here are some very important rules to consider when optimizing for mobile search:

1 Mobile Searchers Use Shorter Key phrases/Keywords

Mobile users search for shorter keyphrases, or even just keywords. Even mobile devices with QWERTY keyboards are awkward for typing long text, which is why mobile searchers are usually very brief in their queries. Very often the search query is limited to only two or even one word. As a result, if you don't rank well for shorter keyphrases (which, unfortunately, are also more competitive), you will miss a lot of mobile traffic.

2 Mobile Search Is Mainly Local Search

Mobile users search mostly for local stuff. In addition to shorter search keyphrases, mobile searchers are also locally targeted. It is easy to understand - when a user is standing in the street and is looking for a place to dine, he or she is most likely looking for things in the neighborhood, not in another corner of the world. Searches like “pizza 5th Avenue” are quite popular, which makes local search results even more important to concentrate on.

3 Current Data Rules in Mobile Search

Sports results, news, weather, and financial information are among the most popular mobile search categories. The topics and niches mobile users prefer are somewhat limited, but they revolve around places to eat or shop in the area, sports results, news, weather conditions, market information, and similar topics where timing and location are key. If your site is in one of these niches, you really need to optimize it, because if it is not mobile-friendly, chances are you are losing visitors. You could even consider having two separate versions of your site – one for desktop searchers and one for mobile searchers.

4 In Mobile Search, Top 10 Is Actually Top 3

Users hate scrolling down long search pages or hitting Next, Next, Next. Desktop searchers aren't fond of scrolling endless pages either, but in mobile search the limitations are even more severe. A page with 10 search results fits on a desktop screen, but on a mobile device it might be split across 2 or more screens. Therefore, in mobile search it is not Top 10 but more like Top 4, or even Top 3, because only the first 3 or 4 positions appear on the first screen and have a higher chance of attracting the user's attention without a trip to the next page.

5 Promote Your Mobile-Friendly Site

Submit your site to major mobile search engines, mobile portals, and directories. It is great if your visitors come from Google and the other major search engines, but if you want even more traffic, mobile search engines, mobile portals, and directories are even better. For now these mobile resources work great for bringing mobile traffic, so don't neglect them. Very often a mobile user doesn't search with Google but goes to a portal he or she knows; if your site is listed with that portal, the user will come to you directly from there, not from a search engine. The case with directories is similar – i.e. if you are optimizing the site of a pizza restaurant, you should submit it to all directories that list pizza restaurants, and restaurants in general, for your location.

6 Follow Mobile Standards

Mobile standards are somewhat different, and if you want your site to be spiderable, you need to comply with them. Check the W3C guidelines to see what the mobile standards are. Even if your site doesn't comply with them, it will still be listed in search results, but it will be transcoded by the search engine, and the result could be pretty shocking to see. Transcoders convert sites to a mobile format, but this is not done in a sophisticated manner, and the output can be truly unbelievable – and anything but mobile-friendly.
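As a rough sketch of what a standards-compliant mobile page of this era looks like, here is a minimal skeleton using the XHTML Mobile Profile doctype; the page title and the mobile.css file name are hypothetical:

```html
<!DOCTYPE html PUBLIC "-//WAPFORUM//DTD XHTML Mobile 1.2//EN"
  "http://www.openmobilealliance.org/tech/DTD/xhtml-mobile12.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
  <title>Example Mobile Page</title>
  <!-- a handheld stylesheet signals that a mobile presentation exists -->
  <link rel="stylesheet" media="handheld" href="mobile.css" />
</head>
<body>
  <p>Short, mobile-friendly content goes here.</p>
</body>
</html>
```

A page declared this way is far less likely to be run through a transcoder than a plain desktop page.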

7 Don't Forget Meta.txt

Meta.txt is a special file where you briefly describe the contents of your site and point the user agent to the most appropriate version for it. Search engine spiders index the meta.txt file directly (provided it is located in the root directory), so even if the rest of your site is not accessible, you will still be included in search results. Meta.txt is similar to robots.txt in desktop search, but it also has some similarity with metatags, because you can put content in it (as you do with the Description and Keywords metatags). The format of the meta.txt file is colon-delimited (as is the format of robots.txt); each field has the syntax <fieldname>:<value>. One of the advantages of meta.txt is that it is easily parsed by both humans and search engines.
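A hypothetical meta.txt might look like the sketch below. It follows the colon-delimited <fieldname>:<value> format described above, but the specific field names and values are illustrative assumptions, not an official reference:

```
# meta.txt placed in the site root; each line is <fieldname>:<value>
Title:Example Pizza Restaurant
Description:Fresh pizza delivered in downtown London
Keywords:pizza, delivery, London
Link:http://m.example.com/
```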

8 No Long Pages for Mobile Searchers

Use shorter texts, because mobile users don't have the time to read lengthy pages. We already mentioned that mobile searchers don't like lengthy keyphrases – well, they like lengthy pages even less! This is why it would be great if you could make a special, shorter mobile version of your site. Short pages don't mean you should skip your keywords, though. Keywords are vital for mobile search, so don't exclude them, but don't keyword stuff either.

9 Predictive Search Is Popular With Mobile Searchers

Use phrases that are common in predictive search. Predictive search is popular with mobile searchers because it saves typing effort. This is why, if your keywords are among the common predictive search results, your chances of being found increase seriously. It is true that predictive search keywords change from time to time and you can't always follow them, but you should at least give it a try.

10 Preview Your Site on Mobile Devices

Always check how your site looks on a mobile device. With the plethora of devices and screen sizes, it is not possible to check your site on absolutely every device you can think of, but checking it on at least a couple of the most important ones is far better than nothing. Even if you manage to get visitors from mobile search engines, if your site displays distorted on a mobile screen, these visitors will run away. Transcoding is one reason a site gets distorted, so it is really a good idea to make your site mobile-friendly yourself instead of relying on search engines to transcode it and turn it into a design nightmare in the process.

Mobile search is relatively new, but it is a safe bet that it will get a huge boost in the near future. If you are uncertain whether your particular site deserves to be optimized for mobile devices, use the AdWords Keyword Tool to track mobile search volumes for your keywords. If the volumes are high, or if a particular keyword is doing remarkably well in the mobile segment, invest more time and effort in optimizing for it.

How to Optimize for Baidu - Monday, September 05, 2011
Baidu is the most popular search engine in China, more popular than Google itself. This is why, if you have visitors from China, it makes sense to optimize your site for Baidu as well. The rules for ranking well with Baidu are similar to the rules of the other search engines, yet there are differences, as we show in the article.

Usually, SEO efforts are directed towards achieving top rankings with Google and sometimes with Yahoo and Bing. However, in addition to the Big Three, there are other search engines that might be of interest to you. In fact, some of them might prove a better option than Google, Yahoo, or Bing: if a search engine is used by your target audience, it will be more efficient for you, and it is worth spending some time optimizing for it.

If you haven't heard about Baidu, don't worry. It is a popular search engine but its reach is not global and this is why many people don't even know about it. Still, Baidu is certainly not just one more search engine to waste your time with. Baidu is big in China and since the population of China is more than a billion, if you rank well with Baidu, this can make quite a difference. In fact, if you are operating globally, not to mention if your visitors are based mainly in China, you can't afford to miss this market. On the Chinese market, the share of Baidu is around 60% and it is the most popular Chinese language search engine. Google is less popular in China than Baidu, so if your traffic comes from the Chinese market, it pays to optimize your site for Baidu.

The algorithm of Baidu is different from the algorithms of Google, Bing, and Yahoo. In a sense, it is less sophisticated and somewhat resembles the algorithms the other search engines used many years ago. Here are some tips on what you should do in order to get decent rankings with Baidu:

1 Find the right Chinese keywords

Of course, as with other search engines, keywords are important for good rankings. You need to find the right Chinese keywords to optimize for. This might be a challenge, because Chinese has many dialects and the same words have different meanings in different dialects. However, Baidu prefers Pinyin Chinese, which makes it your best choice. You should stick to it not only for your keywords but for your content as a whole.

2 You need LOTS of content in Chinese

Even if Chinese is not the main language of your site, you need many pages in Chinese. With Baidu, content is king – the same as with other search engines. When generating tons of content in Chinese, follow the official guidelines for what content is acceptable in China, because the rules there are strict, and if you don't obey them, it could cost you more than just your good rankings with Baidu.

3 Metatags weigh a lot

As in the early days of the other search engines, metatags are very important with Baidu, so don't forget to make your metatags top-notch. However, don't abuse metatags and don't stuff them with keywords.

4 Get a Pinyin Chinese domain name and host your site on a Chinese host

Domain names are important with Baidu as well. In addition to having keywords in your domain name, you need to have a domain name in Pinyin Chinese. You can use a .com, .net, or .cn extension with it. For even better results, host your site on a Chinese host because this gives you an additional bonus with Baidu.

5 Use simple navigation structures

Simple navigation structures are a must with every search engine but for Baidu they matter even more. Baidu won't follow links that are deeply buried in all kinds of messy code or that go many levels deep in the site hierarchy.

6 Watch for duplicate content

Baidu is very strict about duplicate content. You might have problems with duplicate content on the other search engines as well, but Baidu is even less tolerant. Use a robots.txt file to tell crawlers what not to index and you are safe.
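For instance, a robots.txt in the site root can keep duplicate areas out of the index. Baiduspider is the name of Baidu's crawler; the paths below are hypothetical examples:

```
# Block hypothetical duplicate areas (printer-friendly copies, archives)
User-agent: Baiduspider
Disallow: /print/
Disallow: /archive/duplicates/

# Apply the same restriction to all other crawlers
User-agent: *
Disallow: /print/
```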

7 No links to bad neighbors and no link farms

Linking to bad neighbors and getting links from link farms isn't a good idea with any search engine but penalties with Baidu are even more severe, so you need to consider this. Also, don't put too many outbound links on your site because this also affects your Baidu rankings in a negative way.

8 Plan in advance

In Google you can reach top rankings within a week (though this certainly isn't the norm, and we don't mean you should use blackhat strategies to achieve it), but with Baidu success doesn't come that fast. With Baidu it can take 6 months or more to achieve the rankings you might achieve in Google overnight, and you need to take this into account. For instance, if you are promoting a summer-related site, you should start optimizing no later than November, so that when the season comes, your site will have achieved the rankings you want.

9 Baidu doesn't deal with Flash and JavaScript

Flash and JavaScript aren't Google's favorites, but Baidu absolutely hates them. This is why you should use Flash and JavaScript only if you provide alternative HTML versions of the content you have incorporated in them. Baidu doesn't like iFrames either, so avoid them as well.
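One common way to provide an HTML alternative for JavaScript-driven content is the noscript element. In this sketch the file name and product links are hypothetical:

```html
<!-- The JavaScript widget is served to browsers as usual -->
<script src="product-carousel.js"></script>

<!-- Crawlers that skip JavaScript can still read and index the
     equivalent plain-HTML content placed inside <noscript> -->
<noscript>
  <ul>
    <li><a href="/products/pizza-margherita.html">Pizza Margherita</a></li>
    <li><a href="/products/pizza-quattro.html">Pizza Quattro Stagioni</a></li>
  </ul>
</noscript>
```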

10 Make sure your site is spiderable

Like any other search engine, Baidu uses crawlers, so make sure your site is spiderable. Use a spider simulator to check what on your site is accessible and what isn't.

As you can see, optimization for Baidu isn't totally different from optimization for any other search engine, but it certainly has its specifics. Follow the rules, be patient, and sooner or later success will come. When you are done with your Baidu optimization, don't be surprised if your site starts ranking better with Google as well, especially for country-specific searches. With so much content in Chinese and a Chinese domain name, you will inevitably achieve better rankings for your Chinese search terms in any other search engine too.

SEO Musts for Local Business - Monday, September 05, 2011

When you are doing business locally, you need local traffic. Maybe you are asking yourself how this is possible, since search engines are global in nature. Read the article and you will learn what you can do to get targeted local traffic to your site.

The Internet might be global in nature, but if your business is local, it makes no sense to concentrate on global reach, when your customers live in your city, or even in your neighborhood. For local businesses getting a global reach is a waste of resources. Instead, you should concentrate on the local community. You might be asking how you can do it, when the Web is global and Google doesn't classify sites according to their location. Here is how you can go local with SEO:

1 Use your location in your keywords.

The first trick is to use your location in your keywords. For example, if you are in London and you sell car insurance, your most important keyphrase should be “car insurance London”, because it contains both your business and your location and will attract people who are looking for car insurance in London in particular.

2 Use your location in metatags

Metatags matter to search engines, and you shouldn't forget to include your location, together with your other keywords, in the metatags of your site's pages. Of course, you must also have your location in the keywords you use in the body text, because it looks a bit suspicious when your body text doesn't mention your location but your tags are stuffed with it.
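Here is a sketch using the article's “car insurance London” example; the company name and the title and description wording are hypothetical:

```html
<head>
  <!-- the location appears in the title, description, and keywords -->
  <title>Car Insurance London | Example Insurance Ltd</title>
  <meta name="description"
        content="Affordable car insurance in London. Get a quote online or visit our London office." />
  <meta name="keywords"
        content="car insurance London, London car insurance, car insurance quotes" />
</head>
```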

3 Use your location in your body text

Keywords in the body text count a lot and you can't afford to skip them. If your web copy is optimized for “car insurance” only, this won't help you rank well with “car insurance London”, so make sure that your location is part of your keywords.

4 Take advantage of Google Places and Yahoo Local

Google Places and Yahoo Local are great places to submit to because they will include you in their listings for a particular location.

5 Create backlinks with your location as anchor text

It can be a bit tricky to get organic backlinks with your location as anchor text, because some keywords with a location don't sound very natural – for instance, “car insurance London” isn't grammatically correct, and you will hardly get an organic inline link with it. You can, however, use it in the Name field when commenting on blogs; if the blog is dofollow, you will still get a backlink with anchor text that helps your SEO.

6 Get included in local search engines

Global search engines such as Google, Bing, or Yahoo can bring you lots of traffic, but depending on your location, local search engines might be the real gold mine. A local search engine could mean a search engine for your area (though regional search engines are not very common) or, more likely, for your country. For instance, Baidu is a great option if you are selling on the Chinese market.

7 Get listed in local directories

In addition to local search engines, you need to try your luck with local directories, too. You might think that nobody reads directory listings but this isn't exactly so. For instance, Yellow Pages are one of the first places where people look when searching for a local vendor for a particular product.

8 Run locally-targeted ad campaigns

One of the most efficient ways to drive targeted, local traffic to your site is with the help of locally-targeted ad campaigns. PPC ads and classifieds are the two options that work best – at least for most webmasters.

9 Do occasional checks of your keywords

Occasionally checking the current search volume of your keywords is a good idea, because shifts in search volumes are quite typical. Needless to say, if people no longer search for “car insurance London” because they have started using other search phrases, and you continue to optimize for “car insurance London”, that is a waste of time and money. Also, keep an eye on the keywords your competitors use – this will give you a clue as to which keywords work and which don't.

10 Use social media

Social media can drive more traffic to a site than search engines and for local search this is also true. Facebook, Twitter, and the other social networking sites have a great sales potential because you can promote your business for free and reach exactly the people you need. Local groups on social sites are especially valuable because the participants there are mainly from the region you are interested in.

11 Ask for reviews and testimonials

Client reviews and testimonials are a classic business instrument – they are like letters of recommendation for your business. As far as SEO is concerned, however, they can play another role. There are review sites where you can publish such reviews and testimonials (or ask your clients to do it), and this will drive business to you. Some of these sites are Yelp and Merchant Circle, but it is quite probable that there are regional or national review sites where you can also post.

12 Create separate pages for your different locations

When you have business in several locations, the task becomes a bit more difficult, because you can't possibly optimize for all of them – you can't have a keyphrase such as “car insurance London, Berlin, Paris, New York”. In this case the solution is to create separate pages for your different locations. If your locations span the globe, you can also create different sites on country-specific domains (i.e. .co.uk for the UK, .de for Germany, etc.), but this is only reasonable if your business is truly multinational. Otherwise, a separate page for each of your locations will do.

These simple tips on how to optimize your site for local searches are a must if you rely on the local market. Maybe you are already doing some of them and know what works for you and what doesn't. Either way, if you haven't tried them all, try them now and see whether they have a positive impact on your rankings (and your business).
